Do not sleep on traditional machine learning

Authors

Abstract

Over the last few years, research in automatic sleep scoring has mainly focused on developing increasingly complex deep learning architectures. However, recently these approaches have achieved only marginal improvements, often at the expense of requiring more data and more expensive training procedures. Despite all these efforts and their satisfactory performance, automatic sleep staging solutions are not widely adopted in a clinical context yet. We argue that most deep learning solutions for sleep scoring have limited real-world applicability, as they are hard to train, deploy, and reproduce. Moreover, they lack interpretability and transparency, which are key to increasing adoption rates. In this work, we revisit the problem of sleep stage classification using classical machine learning. Results show that competitive performance can be achieved with a conventional machine learning pipeline consisting of preprocessing, feature extraction, and a simple machine learning model. In particular, we analyze the performance of a linear model and a non-linear model (gradient boosting). Our approach surpasses the state-of-the-art (that uses the same data) on two public datasets: Sleep-EDF SC-20 (MF1 0.810) and Sleep-EDF ST (MF1 0.795), while achieving competitive results on Sleep-EDF SC-78 (MF1 0.775) and MASS SS3 (MF1 0.817). We show that, for this task, the expressiveness of an engineered feature vector is on par with the internally learned representations of deep learning models. This observation opens the door to clinical adoption, as a representative feature vector allows one to leverage both the interpretability and the successful track record of traditional machine learning methods.
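The pipeline outlined in the abstract (preprocessing, feature extraction, simple model) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual code: the frequency bands, Welch parameters, gradient-boosting settings, and synthetic data below are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import GradientBoostingClassifier

FS = 100  # assumed EEG sampling rate in Hz (Sleep-EDF EEG is sampled at 100 Hz)

# Hypothetical frequency bands commonly used in sleep-staging feature sets
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "sigma": (12, 16), "beta": (16, 30)}

def band_power_features(epoch, fs=FS):
    """Log band powers extracted from one 30-s single-channel EEG epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 4)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[mask].mean() + 1e-12))
    return np.array(feats)

# Synthetic stand-in data: 200 epochs of 30 s, 5 sleep stages (W, N1, N2, N3, REM)
rng = np.random.default_rng(0)
X = np.stack([band_power_features(rng.standard_normal(30 * FS))
              for _ in range(200)])
y = rng.integers(0, 5, size=200)

# The non-linear model analyzed in the paper is a gradient-boosting classifier;
# hyperparameters here are placeholders.
clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
pred = clf.predict(X)
```

In practice the paper's feature vector is far richer than five band powers, but the structure (fixed feature extraction followed by a conventional classifier) is the point of the approach.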



Similar articles

Do NLP and machine learning improve traditional readability formulas?

Readability formulas are methods used to match texts with the readers’ reading level. Several methodological paradigms have previously been investigated in the field. The most popular paradigm dates several decades back and gave rise to well known readability formulas such as the Flesch formula (among several others). This paper compares this approach (henceforth ”classic”) with an emerging par...


Traditional Perceptrons Do Not Produce the Overexpectation Effect

Perceptrons are typically viewed as being an artificial neural network that embodies the Rescorla-Wagner model of learning. One of the important properties of the Rescorla-Wagner model was its prediction of the overexpectation effect. However, we show below that a typical perceptron is not capable of generating this effect. This result brings into question assumed relationships between artifici...


Sleep does not enhance motor sequence learning.

Improvements in motor sequence performance have been observed after a delay involving sleep. This finding has been taken as evidence for an active sleep consolidation process that enhances subsequent performance. In a review of this literature, however, the authors observed 4 aspects of data analyses and experimental design that could lead to improved performance on the test in the absence of a...


Multi-Document Discourse Parsing Using Traditional and Hierarchical Machine Learning

Multi-document handling is essential today, when many documents on the same topic are produced, especially considering the Web. Both readers and computer applications can benefit from a discourse analysis of this multidocument content, since it demonstrates clearly the relations among portions of these documents. This work aims to identify such relations automatically using machine learning tec...


Do Not Copy

Design a clocked synchronous state machine with four inputs, G1–G4, that are connected to pushbuttons. The machine has four outputs, L1–L4, connected to lamps or LEDs located near the like-numbered pushbuttons. There is also an ERR output connected to a red lamp. In normal operation, the L1–L4 outputs display a 1-out-of-4 pattern. At each clock tick, the pattern is rotated by one position; the ...
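The normal-operation behavior stated in this exercise (a 1-out-of-4 lamp pattern rotated one position per clock tick) can be sketched behaviorally. Since the snippet is truncated, the pushbutton inputs G1–G4 are omitted, and the assumption that ERR flags any state that is not a valid 1-out-of-4 pattern (recovering to a known pattern) is hypothetical.

```python
def step(state):
    """One clock tick of the lamp machine.

    state: 4-bit int, bit i drives lamp L(i+1).
    Returns (next_state, err). Assumption: an invalid (non 1-out-of-4)
    state asserts ERR and resets the pattern to L1.
    """
    valid = state in (0b0001, 0b0010, 0b0100, 0b1000)
    if not valid:
        return 0b0001, True  # recover to a known 1-out-of-4 pattern
    # Rotate the single lit lamp one position (L4 wraps around to L1)
    rotated = ((state << 1) | (state >> 3)) & 0b1111
    return rotated, False

state = 0b0001  # L1 lit
for _ in range(4):
    state, err = step(state)
# after four ticks the pattern has wrapped back to L1
```

A synthesizable version would encode the same next-state logic in a hardware description language, with ERR derived combinationally from the state bits.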



Journal

Journal title: Biomedical Signal Processing and Control

Year: 2023

ISSN: 1746-8094, 1746-8108

DOI: https://doi.org/10.1016/j.bspc.2022.104429